Predicting the Learning Rate of Gradient Descent for Accelerating Matrix Factorization
Authors
Abstract
Matrix Factorization (MF) has become the predominant technique in recommender systems. The model parameters are usually learned by means of numerical methods such as gradient descent. The learning rate of gradient descent is typically set to a low value to ensure that the algorithm does not miss a local optimum; as a consequence, the algorithm may take many iterations to converge. Ideally, one wants a learning rate that leads to a local optimum within the first iterations, but this is very difficult to achieve given the high complexity of the search space. Starting with an exploratory analysis on several recommender systems datasets, we observed an overall linear relationship between the learning rate and the number of iterations needed until convergence. Another key observation is that this relationship holds across the different recommender datasets chosen. From this, we propose to use simple linear regression models to predict, for an unknown dataset, a good learning rate to start with. The idea is to estimate a learning rate that gets as close as possible to a local optimum in the first iteration, without overshooting it. We evaluate our approach on 8 real-world recommender datasets and compare it against the standard learning algorithm, which uses a fixed learning rate, and against adaptive learning rate strategies from the literature. We show that, for some datasets, we can reduce the number of iterations by up to 40% compared to the standard approach.
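To make the idea concrete, below is a minimal Python sketch of the procedure the abstract describes. It is an illustrative assumption throughout, not the authors' implementation: all function and variable names, the synthetic ratings data, and the convergence criterion are made up for the example. Matrix factorization is trained by plain gradient descent at several learning rates on a known dataset, a simple linear regression is fit to the resulting (learning rate, iterations-to-convergence) pairs, and the fitted line is inverted to suggest a learning rate expected to converge in roughly one iteration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny synthetic ratings matrix standing in for a "known" dataset
# (0 marks a missing entry); purely illustrative.
mask = rng.random((50, 40)) < 0.2
R_known = np.where(mask, rng.integers(1, 6, size=(50, 40)), 0).astype(float)

def iterations_to_converge(R, lr, rank=10, reg=0.02, tol=1e-4, max_iter=500, seed=0):
    """Train MF by gradient descent on the observed entries of R and return
    the number of iterations until the squared-error loss stops improving
    by more than tol."""
    g = np.random.default_rng(seed)
    P = 0.1 * g.standard_normal((R.shape[0], rank))
    Q = 0.1 * g.standard_normal((R.shape[1], rank))
    users, items = np.nonzero(R)
    prev_loss = np.inf
    for it in range(1, max_iter + 1):
        loss = 0.0
        for u, i in zip(users, items):
            err = R[u, i] - P[u] @ Q[i]
            pu = P[u].copy()  # keep the pre-update value for Q's update
            P[u] += lr * (err * Q[i] - reg * P[u])
            Q[i] += lr * (err * pu - reg * Q[i])
            loss += err ** 2
        if prev_loss - loss < tol:
            return it
        prev_loss = loss
    return max_iter

# Step 1: on a known dataset, measure how many iterations each candidate
# learning rate needs (the exploratory analysis the abstract describes).
candidate_lrs = np.array([0.001, 0.002, 0.005, 0.01, 0.02])
iters = np.array([iterations_to_converge(R_known, lr) for lr in candidate_lrs])

# Step 2: fit the simple linear model iterations ~= a * lr + b.
a, b = np.polyfit(candidate_lrs, iters, deg=1)

# Step 3: invert the model to estimate the learning rate expected to reach
# a local optimum in (roughly) the first iteration: solve a * lr + b = 1.
predicted_lr = (1.0 - b) / a
print(f"suggested starting learning rate: {predicted_lr:.4f}")
```

Since the abstract reports that the linear relationship is consistent across recommender datasets, the regression would in practice be fitted on several known datasets and then applied to an unseen one; the sketch uses a single synthetic dataset only to keep the example short.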
Similar resources
Online Passive-Aggressive Algorithms for Non-Negative Matrix Factorization and Completion
Stochastic Gradient Descent (SGD) is a popular online algorithm for large-scale matrix factorization. However, SGD can often be difficult to use for practitioners, because its performance is very sensitive to the choice of the learning rate parameter. In this paper, we present non-negative passive-aggressive (NN-PA), a family of online algorithms for non-negative matrix factorization (NMF). Our al...
A Novel Fuzzy Based Method for Heart Rate Variability Prediction
In this paper, a novel technique based on a fuzzy method is presented for chaotic nonlinear time series prediction. A fuzzy approach combined with a gradient learning algorithm constitutes the main component of this method. The learning process is similar to the conventional gradient descent learning process, except that the input patterns and parameters are stored in mem...
Accelerating Stochastic Gradient Descent via Online Learning to Sample
Stochastic Gradient Descent (SGD) is one of the most widely used techniques for online optimization in machine learning. In this work, we accelerate SGD by adaptively learning how to sample the most useful training examples at each time step. First, we show that SGD can be used to learn the best possible sampling distribution of an importance sampling estimator. Second, we show that the samplin... (A generic sketch of the importance-sampling idea appears after this list.)
BPR: Bayesian Personalized Ranking from Implicit Feedback
Item recommendation is the task of predicting a personalized ranking on a set of items (e.g. websites, movies, products). In this paper, we investigate the most common scenario with implicit feedback (e.g. clicks, purchases). There are many methods for item recommendation from implicit feedback like matrix factorization (MF) or adaptive k-nearest-neighbor (kNN). Even though these methods are des...
Designing stable neural identifier based on Lyapunov method
The stability of the learning rate in neural network identifiers and controllers is a challenging issue that attracts great interest from neural network researchers. This paper suggests an adaptive gradient descent algorithm with stable learning laws for a modified dynamic neural network (MDNN) and studies the stability of this algorithm. Also, a stable learning algorithm for parameters of ...
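The "Accelerating Stochastic Gradient Descent via Online Learning to Sample" entry above builds on importance sampling for SGD. As a generic, hedged illustration of that idea only, and not of that paper's actual online-learning algorithm, the sketch below draws training examples non-uniformly and reweights each sampled gradient by 1/(n·p_i) so the update remains an unbiased estimate of the full gradient; the least-squares objective, the score heuristic, and all names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares problem: loss(w) = mean_i (x_i . w - y_i)^2 / 2.
n, d = 1000, 5
X = rng.standard_normal((n, d))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.standard_normal(n)

w = np.zeros(d)
lr = 0.05

# Sampling distribution proportional to a per-example gradient-norm proxy
# (here: |residual| * ||x_i||), refreshed every 100 steps. Drawing example i
# with probability p_i and scaling its gradient by 1 / (n * p_i) keeps the
# stochastic gradient an unbiased estimate of the full gradient.
for step in range(2000):
    if step % 100 == 0:
        scores = np.abs(X @ w - y) * np.linalg.norm(X, axis=1) + 1e-8
        p = scores / scores.sum()
    i = rng.choice(n, p=p)
    grad_i = (X[i] @ w - y[i]) * X[i]   # gradient of example i's loss
    w -= lr * grad_i / (n * p[i])       # importance-weight correction

print("learned weights:", np.round(w, 2))
```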
Journal title: JIDM
Volume 5, Issue -
Pages -
Publication date: 2014